Full-text access type
Paid full text | 4,225 articles |
Free | 485 articles |
Domestic free | 472 articles |
Subject category
Surveying and mapping | 1,856 articles |
Atmospheric sciences | 342 articles |
Geophysics | 706 articles |
Geology | 1,002 articles |
Oceanography | 436 articles |
Astronomy | 91 articles |
General | 449 articles |
Physical geography | 300 articles |
Publication year
2024 | 7 articles |
2023 | 28 articles |
2022 | 114 articles |
2021 | 158 articles |
2020 | 200 articles |
2019 | 159 articles |
2018 | 118 articles |
2017 | 239 articles |
2016 | 243 articles |
2015 | 212 articles |
2014 | 295 articles |
2013 | 304 articles |
2012 | 358 articles |
2011 | 323 articles |
2010 | 199 articles |
2009 | 202 articles |
2008 | 202 articles |
2007 | 238 articles |
2006 | 203 articles |
2005 | 195 articles |
2004 | 150 articles |
2003 | 136 articles |
2002 | 105 articles |
2001 | 118 articles |
2000 | 93 articles |
1999 | 80 articles |
1998 | 63 articles |
1997 | 70 articles |
1996 | 67 articles |
1995 | 56 articles |
1994 | 54 articles |
1993 | 45 articles |
1992 | 37 articles |
1991 | 14 articles |
1990 | 15 articles |
1989 | 20 articles |
1988 | 19 articles |
1987 | 15 articles |
1986 | 11 articles |
1985 | 6 articles |
1984 | 2 articles |
1983 | 2 articles |
1982 | 1 article |
1979 | 3 articles |
1977 | 2 articles |
1954 | 1 article |
Sort order: 5,182 query results found (search time: 15 ms)
91.
A genetic algorithm is applied, with a dislocation-theory model, to the inversion of three-dimensional fault slip parameters, and GPS observations from western Sichuan for 2004–2007 are used to compute and analyze the three-dimensional slip rates of the main faults of the Longmenshan fault zone. The results show that the inverted slip rates are generally small and agree well with geological estimates: the strike-slip component |U1| < 3.2 mm/a, the dip-slip component |U2| < 1.54 mm/a, and the opening component |U3| < 2.5 mm/a. The low slip rates reflect fault locking, stress accumulation, and the risk of a large earthquake. Locally, some components exceed the geological estimates, indicating that the inversion of measured GPS data reflects the real-time state of fault activity. The global convergence of the genetic algorithm and its independence from initial values make the results more stable, while the opening component shows local irregularity, revealing spatially non-uniform activity among the sub-faults.
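The inversion scheme described above can be sketched as a minimal real-coded genetic algorithm that fits slip parameters to surface velocities. The forward model here is a hypothetical linear operator standing in for the dislocation model (a real implementation would use an Okada-type Green's function), and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model: surface velocities = G @ slip.
# A real dislocation model (e.g. Okada) would replace G.
n_params, n_obs = 3, 12                 # slip components U1,U2,U3; GPS observations
G = rng.normal(size=(n_obs, n_params))
true_slip = np.array([3.0, -1.2, 2.0])  # mm/a, synthetic "truth"
d_obs = G @ true_slip + rng.normal(scale=0.1, size=n_obs)

def misfit(slip):
    """Sum-of-squares data misfit to be minimized."""
    return np.sum((G @ slip - d_obs) ** 2)

# Minimal genetic algorithm: truncation selection, arithmetic crossover,
# Gaussian mutation, elitism. Population is initialized without any prior
# (the "independence from initial values" noted in the abstract).
pop = rng.uniform(-5.0, 5.0, size=(60, n_params))
for _ in range(200):
    fit = np.array([misfit(p) for p in pop])
    elite = pop[np.argsort(fit)[:30]]                # keep the best half
    parents = elite[rng.integers(0, 30, size=(60, 2))]
    alpha = rng.random((60, 1))
    pop = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # crossover
    pop += rng.normal(scale=0.1, size=pop.shape)               # mutation
    pop[0] = elite[0]                                # elitism

best = pop[np.argmin([misfit(p) for p in pop])]
print(best)  # should lie close to true_slip
```

Elitism guarantees the best candidate is never lost between generations, which is what makes the global search stable in practice.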
93.
To help develop new energy sources, our team used the high-density resistivity method to explore for underground geothermal water in the Fenshui area of Liaoning Province, and studied this measurement system in depth. Engineering examples from the area demonstrate the technical merits of the method and its feasibility and practical value in engineering-geological geophysical prospecting.
95.
Co-kriging combines the multivariate nature of geochemical exploration data with kriging's ability to characterize spatial structure. Because it accounts for the spatial attributes of two or more geological variables, applying co-kriging to geochemical data processing can further improve estimation accuracy. Co-kriging was used to predict gold mineralization in the Linwang mining area, Guangxi. Correlation analysis shows that the basic element association in the area is Au, Ag, As and Hg; in the co-kriging interpolation, Au is the primary variable and the associated elements Ag, As and Hg are secondary variables. Comparing the interpolated results with those of traditional multivariate statistics and of ordinary kriging, co-kriging gives smaller estimation errors and higher prediction accuracy, making it advantageous to a certain degree for predicting ore-forming elements.
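The idea of using a secondary element to sharpen the primary estimate can be illustrated with a minimal simple co-kriging sketch. Everything here is assumed for illustration: the 1D sample locations, the Au/As values, and the exponential auto- and cross-covariance models (in practice these are fitted to the survey data, and more elements such as Ag and Hg would be added):

```python
import numpy as np

def cov(h, sill, rang):
    """Assumed exponential covariance model."""
    return sill * np.exp(-np.abs(h) / rang)

x_au = np.array([0.0, 1.0, 3.0])   # primary (Au) sample locations, synthetic
z_au = np.array([2.1, 2.8, 1.5])   # Au values (ppm), synthetic
x_as = np.array([0.5, 2.0, 3.5])   # secondary (As) sample locations, synthetic
z_as = np.array([30., 45., 20.])   # As values (ppm), synthetic
x0 = 1.5                           # estimation point

c_aa = lambda h: cov(h, 1.0, 2.0)    # Au-Au covariance (assumed sill/range)
c_ss = lambda h: cov(h, 100.0, 2.0)  # As-As covariance
c_as = lambda h: cov(h, 8.0, 2.0)    # Au-As cross-covariance

# Assemble the block covariance matrix over all samples of both variables.
xs = np.concatenate([x_au, x_as])
n1 = len(x_au)
H = np.abs(xs[:, None] - xs[None, :])
K = np.empty_like(H)
K[:n1, :n1] = c_aa(H[:n1, :n1])
K[n1:, n1:] = c_ss(H[n1:, n1:])
K[:n1, n1:] = c_as(H[:n1, n1:])
K[n1:, :n1] = c_as(H[n1:, :n1])

# Right-hand side: covariances between the estimation point and all samples.
k0 = np.concatenate([c_aa(np.abs(x_au - x0)), c_as(np.abs(x_as - x0))])
w = np.linalg.solve(K, k0)           # simple co-kriging weights

# Simple co-kriging estimate (sample means stand in for the known means).
residuals = np.concatenate([z_au - z_au.mean(), z_as - z_as.mean()])
z_hat = z_au.mean() + w @ residuals
print(z_hat)
```

The cross-covariance block is what lets the dense secondary variable carry information into the Au estimate; with the cross sill set to zero the system collapses to ordinary single-variable kriging of Au.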
96.
The fracture-cavity reservoirs of the Lower Permian Maokou Formation in the Shunan area of the Sichuan Basin are strongly heterogeneous and relatively deeply buried, and the seismic data have low signal-to-noise ratio and resolution, so seismic prediction is difficult. Forward modeling is used here to analyze the ability of 2D seismic data with different trace intervals to resolve a single fracture-cavity body in the Maokou Formation. In data processing, the effect of high-density seismic data on suppressing spatial aliasing and improving lateral resolution is studied. Comparison of field data and velocity inversion further confirms that high-density seismic acquisition can effectively improve the prediction accuracy of fracture-cavity reservoirs.
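The link between trace interval and spatial aliasing can be made concrete with a back-of-envelope check. For a dipping reflector, the standard sampling condition gives a maximum unaliased frequency of f_max = v / (2 · Δx · sin θ); the velocity and dip below are assumed example values, not figures from the study:

```python
import math

def max_unaliased_freq(v_mps, dx_m, dip_deg):
    """Highest frequency recorded without spatial aliasing for trace
    interval dx_m, medium velocity v_mps, and reflector dip dip_deg."""
    return v_mps / (2.0 * dx_m * math.sin(math.radians(dip_deg)))

v = 4500.0  # m/s, a carbonate-like interval velocity (assumed)
for dx in (50.0, 25.0, 12.5):  # conventional vs. high-density trace intervals
    print(f"dx = {dx:5.1f} m  ->  f_max = {max_unaliased_freq(v, dx, 30.0):6.1f} Hz")
```

Halving the trace interval doubles the unaliased bandwidth, which is why high-density acquisition suppresses spatial aliasing and sharpens lateral resolution of small fracture-cavity bodies.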
97.
With the rapid advance of GPS technology, continuously operating reference station (CORS) systems built on network RTK have become one of the hot spots in GPS applications. This paper describes the composition and functions of the single-base-station CORS system in the Shengli oilfield, its mode of application to oil and gas well-position surveying in the Shengli oil area, and the accuracy achieved, and then discusses the practical significance and application prospects of single-base-station CORS for oil and gas well-position surveying.
99.
Katy A. Boon, Peter Rostron and Michael H. Ramsey, Geostandards and Geoanalytical Research, 2011, 35(3): 353–367
In the assessment of potentially contaminated land, the number of samples and the uncertainty of the measurements (including that arising from sampling) are both important factors in the planning and implementation of an investigation. Both parameters also affect the interpretation of the measurements produced, and the process of making decisions based upon those measurements. However, despite their importance, there has previously been no method for assessing whether an investigation is fit-for-purpose with respect to both of these parameters. The Whole Site Optimised Contaminated Land Investigation (WSOCLI) method has been developed to address this issue, and to allow the optimisation of an investigation with respect to both the number of samples and the measurement uncertainty, using an economic loss function. This function was developed to calculate an 'expectation of (financial) loss', incorporating the costs of the investigation itself, subsequent land remediation, and potential consequential costs. To allow evaluation of the WSOCLI method, a computer program 'OCLISIM' has been developed to produce sample data from simulated contaminated land investigations. One advantage of this approach is that the 'true' contaminant concentrations are created by the program and therefore known, which is not the case in a real contaminated land investigation; this enables direct comparison between functions of the 'true' concentrations and functions of the simulated measurements. A second advantage of simulation is that the WSOCLI method can be tested on many different patterns and intensities of contamination. The WSOCLI method performed particularly well at high sampling densities, producing expectations of financial loss that approximated the true costs, which were also calculated by the program.
WSOCLI was shown to produce notable trends in the relationship between the overall cost (i.e., expectation of loss) and both the number of samples and the measurement uncertainty: (a) low measurement uncertainty was optimal when the decision threshold lay between the mean background and the mean hot-spot concentrations; (b) when the hot-spot mean concentration was equal to or near the decision threshold, mid-range measurement uncertainties were optimal; (c) when the decision threshold exceeded the mean of the hot spot, mid-range measurement uncertainties were again optimal, and the trends indicate that the optimal uncertainty may continue to rise as the difference between the hot-spot mean and the decision threshold increases further; (d) in all of the above scenarios, the optimal measurement uncertainty was lower when there was large geochemical variance (i.e., heterogeneity) within the hot spot; (e) the optimal number of samples for each scenario was indicated by the WSOCLI method, and was generally between 50 and 100 for the scenarios considered, although there was significant noise in the predictions, which needs to be addressed in future work before such conclusions can be made firmer.
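The trade-off the loss function captures can be sketched in miniature. The functional form and every cost figure below are assumptions for illustration, not the paper's actual WSOCLI formulation: investigation cost rises with sample count and with the effort of reducing measurement uncertainty, while misclassification losses (needless remediation, missed contamination) fall:

```python
def expectation_of_loss(n_samples, u_meas,
                        cost_per_sample=100.0,    # investigation cost per sample (assumed)
                        cost_precision=5000.0,    # cost scale of reducing uncertainty (assumed)
                        remediation=20000.0,      # cost of remediating a clean unit (assumed)
                        consequence=100000.0):    # cost of a missed hot spot (assumed)
    """Toy 'expectation of loss': investigation cost plus expected
    misclassification costs, in the spirit of an economic loss function."""
    # Placeholder misclassification probabilities: grow with measurement
    # uncertainty, shrink with sampling density.
    p_false_neg = min(1.0, u_meas * 2.0 / n_samples ** 0.5)
    p_false_pos = min(1.0, u_meas * 1.0 / n_samples ** 0.5)
    investigation = n_samples * cost_per_sample + cost_precision / max(u_meas, 1e-6)
    return (investigation
            + p_false_pos * remediation    # remediating ground that was clean
            + p_false_neg * consequence)   # consequences of missed contamination

# Scanning a small grid reveals an interior optimum: neither maximal
# sampling effort nor minimal uncertainty minimizes total expected loss.
best = min(((n, u, expectation_of_loss(n, u))
            for n in (25, 50, 100, 200)
            for u in (0.1, 0.2, 0.5, 1.0)),
           key=lambda t: t[2])
print(best)
```

Even this toy version reproduces the qualitative finding above: pushing uncertainty ever lower eventually costs more than the misclassification it prevents, so a mid-range uncertainty and a moderate sample count can minimize the expectation of loss.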